
    Putting Science on trial

    No abstract available.

    Earthquake forecasting in Italy, before and after the 1997 Umbria-Marche seismic sequence: a review of earthquake occurrence modeling at different spatio-temporal-magnitude scales

    The main goal of this work is to review the scientific research on earthquake forecasting/prediction in Italy carried out before and after the Umbria-Marche sequence. In particular, I focus on models that aim to address three main practical questions: was (is) Umbria-Marche a region with a high probability of occurrence of a destructive earthquake? Was precursory activity recorded before the mainshock(s)? What was our capability to model the spatio-temporal-magnitude evolution of that seismic sequence? The models are reviewed pointing out what we have learned from the Umbria-Marche earthquakes, in terms of the physical understanding of the earthquake occurrence process and of improving our capability to forecast earthquakes and to track seismic sequences in real time.

    On the earthquake predictability of fault interaction models

    Space-time clustering is the most striking departure of the occurrence process of large earthquakes from randomness. These clusters are usually described ex post by a physics-based model in which earthquakes are triggered by Coulomb stress changes induced by other, surrounding earthquakes. Notwithstanding the popularity of this kind of modeling, its ex ante skill in terms of earthquake predictability gain is still unknown. Here we show that, even in synthetic systems rooted in the physics of fault interaction through Coulomb stress changes, this kind of modeling often does not significantly increase earthquake predictability. The predictability of a fault may increase only when the Coulomb stress change induced by a nearby earthquake is much larger than both the stress changes caused by earthquakes on other faults and the intrinsic variability of the earthquake occurrence process.
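    A minimal sketch of the kind of predictability-gain calculation the abstract alludes to (our illustration, not the authors' code): one fault is modeled as a lognormal renewal process, and a Coulomb stress step dCFS from a nearby earthquake is mapped to a clock advance dt = dCFS / stressing_rate. All parameter values are assumptions.

```python
# Illustrative sketch with assumed parameters, not the paper's model.
import numpy as np
from scipy import stats

mean_T, cov = 300.0, 0.5                    # mean recurrence (yr), aleatory variability
sigma = np.sqrt(np.log(1.0 + cov**2))
mu = np.log(mean_T) - 0.5 * sigma**2
recurrence = stats.lognorm(s=sigma, scale=np.exp(mu))

def one_year_prob(t):
    """P(event in the next year | quiet for t years since the last one)."""
    surv = recurrence.sf(t)
    return (surv - recurrence.sf(t + 1.0)) / surv

t_elapsed = 150.0                           # years since the last event
dCFS = 0.1                                  # stress step from a nearby quake (bar)
stressing_rate = 0.01                       # tectonic loading rate (bar/yr)
dt = dCFS / stressing_rate                  # equivalent clock advance (yr)

gain = one_year_prob(t_elapsed + dt) / one_year_prob(t_elapsed)
print(f"probability gain from the stress step: {gain:.2f}")
```

    With this moderate aleatory variability the gain stays close to one unless dCFS is large compared with the loading accumulated over a good fraction of a recurrence cycle, echoing the abstract's conclusion.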

    Long-term influence of giant earthquakes: backward empirical evidence and forward test

    We investigate the capability of the strongest earthquakes to modify significantly the seismicity in a wide spatiotemporal window. In particular, we show that the strongest earthquakes of the last century were probably able to influence the seismicity at large spatiotemporal distances, extending their reach over thousands of kilometers and decades later. We report statistically significant differences in worldwide seismicity before and after the occurrence of the strongest earthquakes of the last century, whose perturbation is modeled by means of coseismic and postseismic stress variations. This long-term coupling has produced time variations in worldwide seismic activity that appear related to the physical coupling between the focal mechanism of source earthquakes and the tectonic setting of each zone. These results could provide important new insights for seismic hazard assessment because they raise doubts about the validity of two paradigms: the steadiness of the mainshock rate and the isolation of a seismic region from the surrounding areas. Finally, in addition to this backward analysis, we also provide a formal forward test by forecasting the effects on global seismicity of the recent Sumatra–Andaman earthquakes; this is perhaps a unique chance to test the long-term hypothesis with an independent dataset that avoids, by definition, any kind of (often unconscious) optimization of the results, which is an unavoidable possibility in backward analyses.
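    A hedged sketch of the simplest form such a before/after comparison could take (the counts below are invented placeholders, and the paper's actual statistical machinery is more elaborate): with equal-length windows, the "after" count is Binomial(n, 1/2) under the null hypothesis of an unchanged Poisson rate, giving an exact conditional test.

```python
# Toy before/after rate comparison; counts are hypothetical, not the paper's data.
from scipy import stats

n_before, n_after = 412, 496   # hypothetical worldwide counts, equal-length windows
p_value = stats.binomtest(n_after, n_before + n_after, p=0.5).pvalue
print(f"two-sided p-value for a rate change: {p_value:.4f}")
```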

    Probabilistic eruption forecasting and the call for an evacuation

    One of the most critical practical actions to reduce volcanic risk is the evacuation of people from threatened areas during volcanic unrest. Despite its importance, this decision is usually arrived at subjectively by a few individuals, with little quantitative decision support. Here, we propose a possible strategy to integrate a probabilistic scheme for eruption forecasting with cost-benefit analysis, applied to the call for an evacuation of one of the highest-risk volcanoes: Vesuvius. This approach has the following merits. First, it incorporates a decision-analysis framework, expressed in terms of event probability, accounting for all modes of available hazard knowledge. Second, it is a scientific tool, based on quantitative and transparent rules that can be tested. Finally, since the quantitative rules are defined during a period of quiescence, it allows prior scrutiny of any scientific input into the model, thereby minimizing the external stress on scientists during an actual emergency phase. Whilst we specifically report the case of Vesuvius during the MESIMEX exercise, the approach can be generalized to other types of natural catastrophe.
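    The classic cost-benefit rule behind such a call can be stated compactly: call the evacuation when the eruption probability over the decision window exceeds the cost/loss ratio C/L. A minimal sketch with purely illustrative numbers (not values from the MESIMEX exercise):

```python
# Illustrative cost-benefit evacuation rule; all numbers are assumptions.
cost_evacuation = 1.0e8   # C: cost of evacuating the threatened area
loss_if_hit     = 2.0e9   # L: expected loss if an eruption hits an unevacuated area
p_eruption      = 0.08    # probability of an eruption within the decision window

threshold = cost_evacuation / loss_if_hit   # C/L = 0.05 here
action = "call evacuation" if p_eruption > threshold else "wait"
print(f"{action}: P = {p_eruption:.2f}, C/L = {threshold:.2f}")
```

    Defining C, L, and the probability scheme in advance, during quiescence, is what allows the rule to be scrutinized before it is ever invoked in an emergency.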

    The Assumption of Poisson Seismic-Rate Variability in CSEP/RELM Experiments

    Evaluating the performance of earthquake forecasting/prediction models is the main rationale behind recent international efforts such as the Regional Earthquake Likelihood Models (RELM) project and the Collaboratory for the Study of Earthquake Predictability (CSEP). Basically, the evaluation process consists of two steps: 1) running all codes simultaneously to forecast future seismicity in well-defined testing regions; 2) comparing the forecasts through a suite of statistical tests. The tests are based on the likelihood score and check performance in both time and space. All of these tests rely on some basic assumptions that have never been deeply discussed and analyzed. In particular, models are required to specify a rate in space-time-magnitude bins, and these rates are assumed to be independent and characterized by Poisson uncertainty. In this work we explore these assumptions in detail, along with their impact on CSEP testing procedures when applied to a widely used class of models, the Epidemic-Type Aftershock Sequence (ETAS) models. Our results show that, if an ETAS model is an accurate representation of seismicity, this same "right" model is rejected by the current CSEP testing procedures significantly more often than expected. We show that this deficiency stems from the fact that ETAS models produce forecasts with a variability significantly higher than that of a Poisson process, invalidating one of the main assumptions behind the CSEP/RELM evaluation process. Certainly, this shortcoming does not negate the paramount importance of the CSEP experiments as a whole, but it does call for a specific revision of the testing procedures to allow a better understanding of the results of such experiments.
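    To see why clustered seismicity violates the Poisson variance assumption, consider a toy branching process (our illustration, not the paper's experiment) in which Poisson background events in a bin each trigger, recursively, Poisson(n_avg) direct aftershocks:

```python
# Toy branching process with assumed parameters, illustrating over-dispersion.
import numpy as np

rng = np.random.default_rng(0)

def bin_count(mu=5.0, n_avg=0.8):
    """Events in one space-time bin: Poisson background plus triggered cascades."""
    total = pending = rng.poisson(mu)
    while pending:
        pending = rng.poisson(n_avg * pending)   # next aftershock generation
        total += pending
    return total

counts = np.array([bin_count() for _ in range(20_000)])
print(f"mean = {counts.mean():.1f}, variance = {counts.var():.1f}")
# expected mean is mu / (1 - n_avg) = 25, but the variance comes out many
# times larger; a test calibrated on Poisson scatter (variance = mean) will
# reject even the "right" clustered model far too often
```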

    A review and new insights on the estimation of the b-value and its uncertainty

    The estimation of the b-value of the Gutenberg-Richter Law and its uncertainty is crucial in seismic hazard studies, as well as in verifying theoretical assertions such as the universality of the Gutenberg-Richter Law. In spite of the importance of this issue, many scientific papers still adopt formulas that lead to different estimations. The aim of this paper is to review the main concepts relative to the estimation of the b-value and its uncertainty, and to provide some new analytical and numerical insights on the biases introduced by the unavoidable use of binned magnitudes and by measurement errors on the magnitude. We remark that, although corrections for binned magnitudes were suggested in the past, they are still very often neglected in the estimation of the b-value, which implicitly assumes that magnitude is a continuous random variable. In particular, we show that: i) the assumption of continuous magnitude can lead to strong bias in the b-value estimation, and to a significant underestimation of its uncertainty, even for a binning of ΔM = 0.1; ii) a simple correction applied to the continuous formula causes a drastic reduction of both biases; iii) very simple formulas, until now mostly ignored, provide estimations without significant biases; iv) the effect of measurement errors on the bias is negligible compared to that of using binned magnitudes.
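    A simulation sketch of point i) above (our illustration, with assumed parameters): magnitudes drawn from a Gutenberg-Richter law with b = 1 are reported in bins of width ΔM = 0.1; a continuous-magnitude (Aki-type) estimator is visibly biased, while the common half-bin correction to the minimum magnitude largely removes the bias.

```python
# Binning-bias illustration with assumed parameters, not the paper's code.
import numpy as np

rng = np.random.default_rng(1)
b_true, Mc, dM, n = 1.0, 2.0, 0.1, 100_000
beta = b_true * np.log(10)

# continuous GR magnitudes above the true cutoff Mc - dM/2, then binned so
# that Mc is the smallest reported bin centre (the usual catalogue convention)
M = (Mc - dM / 2) + rng.exponential(1.0 / beta, size=n)
M_binned = np.round(M / dM) * dM

b_naive = np.log10(np.e) / (M_binned.mean() - Mc)             # treats M as continuous
b_corr  = np.log10(np.e) / (M_binned.mean() - (Mc - dM / 2))  # half-bin correction
print(f"naive: {b_naive:.3f}   corrected: {b_corr:.3f}   true: {b_true}")
```

    With these settings the naive estimate comes out near 1.13, a roughly 13% bias from binning alone, while the corrected estimate recovers the true value within sampling noise.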

    A technical note on the bias in the estimation of the b-value and its uncertainty through the Least Squares technique

    We investigate conceptually, analytically, and numerically the biases introduced by the least squares technique in the estimation of the b-value of the Gutenberg-Richter Law and of its uncertainty. The biases are introduced by the cumulation operation in the cumulative form of the Gutenberg-Richter Law, by the logarithmic transformation, and by measurement errors on the magnitude. We find that the least squares technique, applied to the cumulative and binned form of the Gutenberg-Richter Law, produces strong bias in both the b-value and its uncertainty, whose amplitude depends on the sample size. Furthermore, the logarithmic transformation produces two distinct endemic bends in the Log(N) versus M curve, which means that this plot might show spurious, apparently significant departures from the Gutenberg-Richter Law. The effect of measurement errors is negligible compared to those of the cumulation operation and the logarithmic transformation. These results show that the least squares technique should never be used to determine the slope of the Gutenberg-Richter Law or its uncertainty.
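    A quick numerical illustration of the pitfall (assumed parameters, not the paper's code): fitting log10 N(≥M) versus M by ordinary least squares on a cumulative, binned catalog, and comparing with the maximum-likelihood estimate on the same data.

```python
# Least-squares vs maximum-likelihood b-value; illustrative simulation only.
import numpy as np

rng = np.random.default_rng(2)
b_true, Mc, dM, n = 1.0, 2.0, 0.1, 500
M = (Mc - dM / 2) + rng.exponential(1.0 / (b_true * np.log(10)), size=n)
M = np.round(M / dM) * dM                       # binned catalogue magnitudes

bins = np.arange(Mc, M.max() + dM, dM)          # bin centres
N_cum = np.array([(M >= m - dM / 2).sum() for m in bins])
keep = N_cum > 0                                # guard against empty tail bins
slope, _ = np.polyfit(bins[keep], np.log10(N_cum[keep]), 1)

b_ls = -slope
b_mle = np.log10(np.e) / (M.mean() - (Mc - dM / 2))
print(f"least squares: {b_ls:.3f}   MLE: {b_mle:.3f}   true: {b_true}")
# the cumulative counts are strongly correlated and the sparse tail dominates
# the fit, so b_ls deviates from the true value in a sample-size-dependent way
# and its formal standard error is not meaningful
```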

    A completeness analysis of the national seismic network of Italy

    We present the first detailed study of the earthquake detection capabilities of the Italian National Seismic Network and of the completeness threshold of its earthquake catalog. The network in its present form started operating on 16 April 2005 and is a significant improvement over the previous networks. For our analysis, we employed the PMC method as introduced by Schorlemmer and Woessner (2008). Unlike traditional methods, which are mostly based on the linearity of earthquake-size distributions, this method does not estimate completeness from earthquake samples. Instead, it derives the detection capabilities of each station of the network and synthesizes them into maps of detection probabilities for earthquakes of a given magnitude, thereby avoiding the many assumptions about earthquake distributions that traditional methods make. The results show that the Italian National Seismic Network is complete at M = 2.9 for the entire territory, excluding the islands of Sardinia, Pantelleria, and Lampedusa. At the M = 2.5 level, which is the reporting threshold of the Italian Civil Protection, the network may miss events in the southern parts of Apulia and the western part of Sicily. The stations are connected through many different telemetry links to the operational datacenter in Rome. Scenario computations show that no significant drop in completeness occurs if one of the three major links fails, indicating a well-balanced network setup.
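    The station-wise logic of such a probability-based completeness analysis can be sketched as follows (a simplified toy version with synthetic picks, not Schorlemmer and Woessner's implementation): each catalogued event at a station is either picked or missed, and the detection probability for a (magnitude, distance) cell is the empirical pick fraction in its neighborhood.

```python
# Toy station-wise detection probability; synthetic data, assumed parameters.
import numpy as np

rng = np.random.default_rng(3)

# synthetic pick history: odds of a pick grow with magnitude, fall with distance
mags = rng.uniform(1.0, 4.0, 5000)
dists = rng.uniform(10.0, 300.0, 5000)                     # epicentral distance, km
p_true = 1.0 / (1.0 + np.exp(-(3.0 * mags - 0.02 * dists - 4.0)))
picked = rng.random(5000) < p_true

def detection_prob(m, d, dm=0.3, dd=30.0):
    """Empirical pick fraction in a (magnitude, distance) neighbourhood."""
    near = (np.abs(mags - m) < dm) & (np.abs(dists - d) < dd)
    return picked[near].mean() if near.any() else float("nan")

# probabilities like this, estimated per station, are then combined into
# network-wide maps of the probability of detecting a magnitude-M earthquake
print(f"P(pick | M 2.5, 100 km) ~ {detection_prob(2.5, 100.0):.2f}")
```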
    • …